{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# COMPSCI 389: Introduction to Machine Learning\n", "# Data Processing Example\n", "\n", "The code below runs gradient descent to minimize the sample mean squared error when using a linear parametric model, with the second-degree (order) polynomial basis.\n", "\n", "#### Unlike before (notebook 15), this code applies the standardization preprocessing step to rescale the features.\n", "\n", "The first code block defines the various functions for this. I recommend skipping down to the next markdown block." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import pandas as pd\n", "import matplotlib.pyplot as plt\n", "from sklearn.base import BaseEstimator\n", "from sklearn.model_selection import train_test_split\n", "\n", "####################################################################\n", "### NOTE: Below we added StandardScaler\n", "####################################################################\n", "from sklearn.preprocessing import PolynomialFeatures, StandardScaler\n", "\n", "# Function to calculate mean squared error (for evaluation)\n", "def mean_squared_error(predictions, labels):\n", " return np.mean((predictions - labels) ** 2)\n", "\n", "# Function to calculate gradients\n", "def compute_gradients(X, y, weights):\n", " predictions = X.dot(weights)\n", " errors = predictions - y\n", " return 2 / X.shape[0] * X.T.dot(errors)\n", "\n", "class PolynomialRegressionGD(BaseEstimator):\n", " def __init__(self, learning_rate, iterations=1000, polynomial_degree=2):\n", " self.learning_rate = learning_rate\n", " self.iterations = iterations\n", " self.polynomial_degree = polynomial_degree\n", "\n", " def fit(self, X, y):\n", " ####################################################################\n", " ### NOTE: The lines below are new - they apply standardization\n", " ####################################################################\n", " # Standardize features and store the scaler\n", " self.scaler_ = StandardScaler().fit(X)\n", " X_scaled = self.scaler_.transform(X)\n", " \n", " # Expand features into polynomial basis and store the transformer\n", " self.poly = PolynomialFeatures(degree=self.polynomial_degree)\n", " X_poly = self.poly.fit_transform(X_scaled) # Use standardized features\n", "\n", " # Get the number of features\n", " numFeatures = X_poly.shape[1]\n", "\n", " # Initialize weights and loss history\n", " self.weights = np.zeros(numFeatures)\n", " self.loss_history = []\n", "\n", " # Print the initial loss\n", " predictions = X_poly.dot(self.weights)\n", " loss = mean_squared_error(predictions, y)\n", " print(f\"Iteration 0/{self.iterations}, Loss: {loss:.4f}\")\n", "\n", " for i in range(1, self.iterations + 1):\n", " # Compute the gradient of the loss function\n", " gradients = compute_gradients(X_poly, y, self.weights)\n", "\n", " # Update the weights using gradient descent\n", " self.weights -= self.learning_rate * gradients\n", "\n", " # Compute, print, and store the resulting loss\n", " loss = mean_squared_error(X_poly.dot(self.weights), y)\n", " self.loss_history.append(loss)\n", " print(f\"Iteration {i}/{self.iterations}, Loss: {loss:.4f}\")\n", "\n", " return self\n", "\n", " def predict(self, X):\n", " ####################################################################\n", " ### NOTE: The line below is new - it applies standardization.\n", " ### NOTE: We don't call \"fit\" again! 
"class PolynomialRegressionGD(BaseEstimator):\n", "    def __init__(self, learning_rate, iterations=1000, polynomial_degree=2):\n", "        self.learning_rate = learning_rate\n", "        self.iterations = iterations\n", "        self.polynomial_degree = polynomial_degree\n", "\n", "    def fit(self, X, y):\n", "        ####################################################################\n", "        ### NOTE: The lines below are new - they apply standardization\n", "        ####################################################################\n", "        # Standardize features and store the scaler\n", "        self.scaler_ = StandardScaler().fit(X)\n", "        X_scaled = self.scaler_.transform(X)\n", "\n", "        # Expand features into polynomial basis and store the transformer\n", "        self.poly = PolynomialFeatures(degree=self.polynomial_degree)\n", "        X_poly = self.poly.fit_transform(X_scaled)  # Use standardized features\n", "\n", "        # Get the number of features\n", "        numFeatures = X_poly.shape[1]\n", "\n", "        # Initialize weights and loss history\n", "        self.weights = np.zeros(numFeatures)\n", "        self.loss_history = []\n", "\n", "        # Print the initial loss\n", "        predictions = X_poly.dot(self.weights)\n", "        loss = mean_squared_error(predictions, y)\n", "        print(f\"Iteration 0/{self.iterations}, Loss: {loss:.4f}\")\n", "\n", "        for i in range(1, self.iterations + 1):\n", "            # Compute the gradient of the loss function\n", "            gradients = compute_gradients(X_poly, y, self.weights)\n", "\n", "            # Update the weights using gradient descent\n", "            self.weights -= self.learning_rate * gradients\n", "\n", "            # Compute, print, and store the resulting loss\n", "            loss = mean_squared_error(X_poly.dot(self.weights), y)\n", "            self.loss_history.append(loss)\n", "            print(f\"Iteration {i}/{self.iterations}, Loss: {loss:.4f}\")\n", "\n", "        return self\n", "\n", "    def predict(self, X):\n", "        ####################################################################\n", "        ### NOTE: The line below is new - it applies standardization.\n", "        ### NOTE: We don't call \"fit\" again! We want to use the same\n", "        ###       transformation used during training.\n", "        ####################################################################\n", "        # Standardize the input features using the stored scaler\n", "        X_scaled = self.scaler_.transform(X)\n", "        # Transform standardized features into the polynomial basis\n", "        X_poly = self.poly.transform(X_scaled)\n", "        return X_poly.dot(self.weights)\n", "\n", "# Load the data set\n", "df = pd.read_csv(\"data/GPA.csv\", delimiter=',')\n", "\n", "# Split the data into features and labels\n", "X = df.iloc[:, :-1]\n", "y = df.iloc[:, -1]\n", "\n", "# Split the data into training and testing sets\n", "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, shuffle=True)\n", "\n", "def run(alpha):\n", "    iterations = 1000\n", "    polynomial_degree = 2\n", "\n", "    # Initialize and fit the model\n", "    model = PolynomialRegressionGD(\n", "        learning_rate=alpha,\n", "        iterations=iterations,\n", "        polynomial_degree=polynomial_degree\n", "    )\n", "    model.fit(X_train, y_train)\n", "\n", "    # Plotting the loss over iterations\n", "    plt.plot(range(1, iterations + 1), model.loss_history)\n", "    plt.xlabel('Iterations')\n", "    plt.ylabel('Mean Squared Error')\n", "    plt.yscale('log')\n", "    plt.title(f'Gradient Descent Loss, Polynomial Degree: {polynomial_degree}')\n", "    plt.show()\n", "\n", "    # Predict on the test set\n", "    predictions = model.predict(X_test)\n", "\n", "    # Calculate MSE on the test set\n", "    mse_test = mean_squared_error(predictions, y_test)\n", "    print(f\"Test MSE: {mse_test:.4f}\")\n", "\n", "    # Calculate the standard error of the MSE\n", "    squared_errors = (predictions - y_test) ** 2\n", "    std_error = np.std(squared_errors) / np.sqrt(len(squared_errors))\n", "    print(f\"Standard Error of MSE: {std_error:.4f}\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `run` function takes the step size (learning rate) `alpha` as its only argument. It then runs 1,000 iterations of gradient descent on the GPA data set using the second-degree polynomial basis. Let's recreate the plot from the last lecture!\n", "\n", "Try to find a value for `alpha` that is effective, starting with 0.1. Remember, running the code may result in errors when the loss becomes `inf` or `nan`."
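, "\n", "For example, you might sweep a few candidate step sizes and compare the resulting loss curves (a sketch; it assumes nothing beyond the `run` function defined above):\n", "\n", "```python\n", "# Try successively smaller step sizes; diverging runs print inf/nan losses.\n", "for alpha in [0.1, 0.01, 0.001]:\n", "    run(alpha)\n", "```"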
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Iteration 0/1000, Loss: 8.4534\n", "Iteration 1/1000, Loss: 147.5583\n", "Iteration 2/1000, Loss: 5331.8391\n", "Iteration 3/1000, Loss: 195359.9667\n", "Iteration 4/1000, Loss: 7160187.5536\n", "Iteration 5/1000, Loss: 262431763.8393\n", "Iteration 6/1000, Loss: 9618525052.3864\n", "Iteration 7/1000, Loss: 352533638708.9949\n", "Iteration 8/1000, Loss: 12920896474210.9355\n", "Iteration 9/1000, Loss: 473570596863278.8750\n", "Iteration 10/1000, Loss: 17357085915910218.0000\n", "Iteration 11/1000, Loss: 636163717696490496.0000\n", "Iteration 12/1000, Loss: 23316372210985693184.0000\n", "Iteration 13/1000, Loss: 854580665256674197504.0000\n", "Iteration 14/1000, Loss: 31321687045570891350016.0000\n", "Iteration 15/1000, Loss: 1147987684797460942684160.0000\n", "Iteration 16/1000, Loss: 42075502591198757006606336.0000\n", "Iteration 17/1000, Loss: 1542131454671759051960877056.0000\n", "Iteration 18/1000, Loss: 56521473946350318250272751616.0000\n", "Iteration 19/1000, Loss: 2071598375994438236507067121664.0000\n", "Iteration 20/1000, Loss: 75927245554428887294718605524992.0000\n", "Iteration 21/1000, Loss: 2782849554376180222386328492310528.0000\n", "Iteration 22/1000, Loss: 101995687921277104086616393046294528.0000\n", "Iteration 23/1000, Loss: 3738297795572558883057869676068470784.0000\n", "Iteration 24/1000, Loss: 137014325734719467185546507509919907840.0000\n", "Iteration 25/1000, Loss: 5021784374367784703031473193697769684992.0000\n", "Iteration 26/1000, Loss: 184056069811786927697016426842806378561536.0000\n", "Iteration 27/1000, Loss: 6745936167127103697956057953856968470822912.0000\n", "Iteration 28/1000, Loss: 247248867247295746597646849156643538163728384.0000\n", "Iteration 29/1000, Loss: 9062048741724933430263335615699669522225037312.0000\n", "Iteration 30/1000, Loss: 332137931759509948152536734373997722066604785664.0000\n", "Iteration 31/1000, Loss: 12173362653144001134264518083258441941000480882688.0000\n", "Iteration 32/1000, Loss: 446172340207926396532314165439057693862397556555776.0000\n", "Iteration 33/1000, Loss: 16352897949294567576788019343258916242747831087005696.0000\n", "Iteration 34/1000, Loss: 599358694479850483957467271169286768701353992424259584.0000\n", "Iteration 35/1000, Loss: 21967411877849292963799002396568323492048494389033435136.0000\n", "Iteration 36/1000, Loss: 805139208049475347155071239382976997430192562011436482560.0000\n", "Iteration 37/1000, Loss: 29509582100210640941676119078380420364595380462975134466048.0000\n", "Iteration 38/1000, Loss: 1081571269940640346054516727415875278113344150224771521445888.0000\n", "Iteration 39/1000, Loss: 39641239512932965502503470511681385040210502136652462594981888.0000\n", "Iteration 40/1000, Loss: 1452911993684855995033281348004047847189195205829283612079423488.0000\n", "Iteration 41/1000, Loss: 53251444387973915566527328310240633475886458232373282866544508928.0000\n", "Iteration 42/1000, Loss: 1951746796592663198918627303316914813096704824664734549557448278016.0000\n", "Iteration 43/1000, Loss: 71534502055121815179042269845970973007547453595798740390155326062592.0000\n", "Iteration 44/1000, Loss: 2621848793710201932582798067983198678544509636985533362997191052886016.0000\n", "Iteration 45/1000, Loss: 96094764059204948236853665618127252822322715661927322339334410163191808.0000\n", "Iteration 46/1000, Loss: 
"Iteration 194/1000, Loss: inf\n", "... (iterations 195-357 omitted; every loss is inf) ...\n", "Iteration 358/1000, Loss: inf\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "c:\\Users\\pthomas\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\numpy\\core\\_methods.py:49: RuntimeWarning: overflow encountered in reduce\n", "  return umr_sum(a, axis, dtype, out, keepdims, initial, where)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Iteration 359/1000, Loss: inf\n", "... (iterations 360-388 omitted; every loss is inf) ...\n", "Iteration 389/1000, Loss: inf\n", "Iteration 390/1000, Loss: nan\n", "... (iterations 391-523 omitted; every loss is nan) ...\n", "Iteration 524/1000, Loss: nan\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "C:\\Users\\pthomas\\AppData\\Local\\Temp\\ipykernel_17324\\2087762327.py:57: RuntimeWarning: invalid value encountered in subtract\n", "  self.weights -= self.learning_rate * gradients\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Iteration 525/1000, Loss: nan\n", "... (the remaining printed losses are all nan) ...\n", "Iteration 
828/1000, Loss: nan\n", "Iteration 829/1000, Loss: nan\n", "Iteration 830/1000, Loss: nan\n", "Iteration 831/1000, Loss: nan\n", "Iteration 832/1000, Loss: nan\n", "Iteration 833/1000, Loss: nan\n", "Iteration 834/1000, Loss: nan\n", "Iteration 835/1000, Loss: nan\n", "Iteration 836/1000, Loss: nan\n", "Iteration 837/1000, Loss: nan\n", "Iteration 838/1000, Loss: nan\n", "Iteration 839/1000, Loss: nan\n", "Iteration 840/1000, Loss: nan\n", "Iteration 841/1000, Loss: nan\n", "Iteration 842/1000, Loss: nan\n", "Iteration 843/1000, Loss: nan\n", "Iteration 844/1000, Loss: nan\n", "Iteration 845/1000, Loss: nan\n", "Iteration 846/1000, Loss: nan\n", "Iteration 847/1000, Loss: nan\n", "Iteration 848/1000, Loss: nan\n", "Iteration 849/1000, Loss: nan\n", "Iteration 850/1000, Loss: nan\n", "Iteration 851/1000, Loss: nan\n", "Iteration 852/1000, Loss: nan\n", "Iteration 853/1000, Loss: nan\n", "Iteration 854/1000, Loss: nan\n", "Iteration 855/1000, Loss: nan\n", "Iteration 856/1000, Loss: nan\n", "Iteration 857/1000, Loss: nan\n", "Iteration 858/1000, Loss: nan\n", "Iteration 859/1000, Loss: nan\n", "Iteration 860/1000, Loss: nan\n", "Iteration 861/1000, Loss: nan\n", "Iteration 862/1000, Loss: nan\n", "Iteration 863/1000, Loss: nan\n", "Iteration 864/1000, Loss: nan\n", "Iteration 865/1000, Loss: nan\n", "Iteration 866/1000, Loss: nan\n", "Iteration 867/1000, Loss: nan\n", "Iteration 868/1000, Loss: nan\n", "Iteration 869/1000, Loss: nan\n", "Iteration 870/1000, Loss: nan\n", "Iteration 871/1000, Loss: nan\n", "Iteration 872/1000, Loss: nan\n", "Iteration 873/1000, Loss: nan\n", "Iteration 874/1000, Loss: nan\n", "Iteration 875/1000, Loss: nan\n", "Iteration 876/1000, Loss: nan\n", "Iteration 877/1000, Loss: nan\n", "Iteration 878/1000, Loss: nan\n", "Iteration 879/1000, Loss: nan\n", "Iteration 880/1000, Loss: nan\n", "Iteration 881/1000, Loss: nan\n", "Iteration 882/1000, Loss: nan\n", "Iteration 883/1000, Loss: nan\n", "Iteration 884/1000, Loss: nan\n", "Iteration 885/1000, Loss: nan\n", "Iteration 886/1000, Loss: nan\n", "Iteration 887/1000, Loss: nan\n", "Iteration 888/1000, Loss: nan\n", "Iteration 889/1000, Loss: nan\n", "Iteration 890/1000, Loss: nan\n", "Iteration 891/1000, Loss: nan\n", "Iteration 892/1000, Loss: nan\n", "Iteration 893/1000, Loss: nan\n", "Iteration 894/1000, Loss: nan\n", "Iteration 895/1000, Loss: nan\n", "Iteration 896/1000, Loss: nan\n", "Iteration 897/1000, Loss: nan\n", "Iteration 898/1000, Loss: nan\n", "Iteration 899/1000, Loss: nan\n", "Iteration 900/1000, Loss: nan\n", "Iteration 901/1000, Loss: nan\n", "Iteration 902/1000, Loss: nan\n", "Iteration 903/1000, Loss: nan\n", "Iteration 904/1000, Loss: nan\n", "Iteration 905/1000, Loss: nan\n", "Iteration 906/1000, Loss: nan\n", "Iteration 907/1000, Loss: nan\n", "Iteration 908/1000, Loss: nan\n", "Iteration 909/1000, Loss: nan\n", "Iteration 910/1000, Loss: nan\n", "Iteration 911/1000, Loss: nan\n", "Iteration 912/1000, Loss: nan\n", "Iteration 913/1000, Loss: nan\n", "Iteration 914/1000, Loss: nan\n", "Iteration 915/1000, Loss: nan\n", "Iteration 916/1000, Loss: nan\n", "Iteration 917/1000, Loss: nan\n", "Iteration 918/1000, Loss: nan\n", "Iteration 919/1000, Loss: nan\n", "Iteration 920/1000, Loss: nan\n", "Iteration 921/1000, Loss: nan\n", "Iteration 922/1000, Loss: nan\n", "Iteration 923/1000, Loss: nan\n", "Iteration 924/1000, Loss: nan\n", "Iteration 925/1000, Loss: nan\n", "Iteration 926/1000, Loss: nan\n", "Iteration 927/1000, Loss: nan\n", "Iteration 928/1000, Loss: nan\n", "Iteration 929/1000, Loss: 
nan\n", "Iteration 930/1000, Loss: nan\n", "Iteration 931/1000, Loss: nan\n", "Iteration 932/1000, Loss: nan\n", "Iteration 933/1000, Loss: nan\n", "Iteration 934/1000, Loss: nan\n", "Iteration 935/1000, Loss: nan\n", "Iteration 936/1000, Loss: nan\n", "Iteration 937/1000, Loss: nan\n", "Iteration 938/1000, Loss: nan\n", "Iteration 939/1000, Loss: nan\n", "Iteration 940/1000, Loss: nan\n", "Iteration 941/1000, Loss: nan\n", "Iteration 942/1000, Loss: nan\n", "Iteration 943/1000, Loss: nan\n", "Iteration 944/1000, Loss: nan\n", "Iteration 945/1000, Loss: nan\n", "Iteration 946/1000, Loss: nan\n", "Iteration 947/1000, Loss: nan\n", "Iteration 948/1000, Loss: nan\n", "Iteration 949/1000, Loss: nan\n", "Iteration 950/1000, Loss: nan\n", "Iteration 951/1000, Loss: nan\n", "Iteration 952/1000, Loss: nan\n", "Iteration 953/1000, Loss: nan\n", "Iteration 954/1000, Loss: nan\n", "Iteration 955/1000, Loss: nan\n", "Iteration 956/1000, Loss: nan\n", "Iteration 957/1000, Loss: nan\n", "Iteration 958/1000, Loss: nan\n", "Iteration 959/1000, Loss: nan\n", "Iteration 960/1000, Loss: nan\n", "Iteration 961/1000, Loss: nan\n", "Iteration 962/1000, Loss: nan\n", "Iteration 963/1000, Loss: nan\n", "Iteration 964/1000, Loss: nan\n", "Iteration 965/1000, Loss: nan\n", "Iteration 966/1000, Loss: nan\n", "Iteration 967/1000, Loss: nan\n", "Iteration 968/1000, Loss: nan\n", "Iteration 969/1000, Loss: nan\n", "Iteration 970/1000, Loss: nan\n", "Iteration 971/1000, Loss: nan\n", "Iteration 972/1000, Loss: nan\n", "Iteration 973/1000, Loss: nan\n", "Iteration 974/1000, Loss: nan\n", "Iteration 975/1000, Loss: nan\n", "Iteration 976/1000, Loss: nan\n", "Iteration 977/1000, Loss: nan\n", "Iteration 978/1000, Loss: nan\n", "Iteration 979/1000, Loss: nan\n", "Iteration 980/1000, Loss: nan\n", "Iteration 981/1000, Loss: nan\n", "Iteration 982/1000, Loss: nan\n", "Iteration 983/1000, Loss: nan\n", "Iteration 984/1000, Loss: nan\n", "Iteration 985/1000, Loss: nan\n", "Iteration 986/1000, Loss: nan\n", "Iteration 987/1000, Loss: nan\n", "Iteration 988/1000, Loss: nan\n", "Iteration 989/1000, Loss: nan\n", "Iteration 990/1000, Loss: nan\n", "Iteration 991/1000, Loss: nan\n", "Iteration 992/1000, Loss: nan\n", "Iteration 993/1000, Loss: nan\n", "Iteration 994/1000, Loss: nan\n", "Iteration 995/1000, Loss: nan\n", "Iteration 996/1000, Loss: nan\n", "Iteration 997/1000, Loss: nan\n", "Iteration 998/1000, Loss: nan\n", "Iteration 999/1000, Loss: nan\n", "Iteration 1000/1000, Loss: nan\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "c:\\Users\\pthomas\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages\\matplotlib\\scale.py:255: RuntimeWarning: overflow encountered in power\n", " return np.power(self.base, values)\n" ] }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAlkAAAHHCAYAAACMfE3pAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABUT0lEQVR4nO3dd1gU5/428HsB6bCEIggKKKICKhoUQowNiEgMiHqiUaOINQZjFD2Wc362HLvHFiXWKMaoQWNLTCyI2BsKJPaKDQXFgoKFss/7Rw77ui4oizuui/fnuva63GeemfnODrt7O/PsjEwIIUBEREREWmWg6wKIiIiIKiOGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsqvR69eoFd3d3lTaZTIbx48frpB4ibRg/fjxkMpmuy9CKK1euQCaTIT4+XuN54+PjIZPJcOXKFa3XRfS6GLJIMhkZGRg0aBDq1KkDc3NzmJubw9vbGzExMfjrr790XZ7kVq9ejTlz5pS7v7u7O2QyGWQyGQwMDGBjY4MGDRqgf//+OHLkiHSF6tDNmzcxfvx4pKenl6t/yRfqsWPHpC1MAiVBouRhaGgIV1dXdOjQodzbTxVXEkpLHubm5nB1dUV4eDiWL1+OZ8+e6bpEnXn8+DHi4uLQpk0bVKtWDVZWVmjcuDEWLFiA4uJiXZen14x0XQBVTlu2bEGXLl1gZGSE7t27w9fXFwYGBjh79iw2bNiABQsWICMjA25ubjqp78mTJzAykvbPf/Xq1Th58iSGDBlS7nkaNWqEYcOGAQAePXqEM2fOYN26dViyZAmGDh2KWbNmSVStbty8eRMTJkyAu7s7GjVqpOty3oiuXbvik08+QXFxMc6cOYMFCxZg69atOHz48DvzGrzIzc0NT548QZUqVSRf14IFC2BpaYlnz54hMzMT27dvR+/evTFnzhxs2bIFNWrUkLyGt83ly5fx9ddfIzg4GLGxsbC2tsb27dvx1Vdf4fDhw1ixYoWuS9RbDFmkdZcuXcLnn38ONzc3JCUloVq1airTp02bhu+//x4GBi8/kJqfnw8LCwtJajQ1NZVkua/LxcUFX3zxhUrbtGnT0K1bN8yePRuenp4YOHCgjqojbXj//fdV9nGzZs0QERGBBQsWYNGiRTqsTHdkMtkbe0/+4x//gL29vfL52LFjsWrVKvTs2ROfffYZDh8+/EbqKKFQKFBQUKDTzyQnJyecOHECPj4+yrYBAwagd+/eWL58OcaMGYPatWvrrD59xtOFpHXTp09Hfn4+li9frhawAMDIyAiDBw9W+R9jr169YGlpiUuXLuGTTz6BlZUVunfvDgDYt28fPvvsM7i6usLExAQ1atTA0KFD8eTJE7Vlb9q0CfXr14epqSnq16+PjRs3llpjaWOyMjMz0bt3bzg6OsLExAQ+Pj5YtmyZSp/du3dDJpNh7dq1mDRpEqpXrw5TU1MEBwfj4sWLyn6tWrXC77//jqtXrypPT7w4Lqy8zMzMsHLlStja2mLSpEkQQiinKRQKzJkzBz4+PjA1NYWjoyMGDBiA+/fvqyzj2LFjCA0Nhb29PczMzFCzZk307t1bpY9CocDcuXPRoEEDmJqawsHBAW3btlU7NffTTz/Bz88PZmZmsLW1xeeff47r16+r9GnVqhXq16+P06dPo3Xr1jA3N4eLiwumT5+u8lo2bdoUABAdHa18nSoyLudFaWlpCAsLg7W1NSwtLREcHKz25VlYWIgJEybA09MTpqamsLOzw0cffYTExERln6ysLERHR6N69eowMTFBtWrV0L59e62O/wkKCgLw9+n1EuvWrVO+xvb29vjiiy+QmZn50uW0bNkSvr6+pU6rW7cuQkNDAfz/05b//e9/sXjxYnh4eMDExARNmzZFSkqK2ry7du1C8+bNYWFhARsbG7Rv3x5nzpxR6VNyKu78+fP44osvIJfL4eDggDFjxkAIgevXr6N9+/awtraGk5MTZs6cqTJ/aWOy/vrrL/Tq1Qu1atWCqakpnJyc0Lt3b9y9e/elr0NFdO/eHX379sWRI0dU9j8AHDlyBG3btoVcLoe5uTlatmyJAwcOqC1j9+7daNKkCUxNTeHh4YFFixaVOm5OJpNh0KBBWLVqFXx8fGBiYoJt27YBKN9nEAA8e/YM48aNQ+3atZWfiSNGjFA75ZmTk4OzZ8/i8ePHL91+e3t7lYBVokOHDgCgtr+p/Hgki7Ruy5YtqF27NgICAjSar6ioCKGhofjoo4/w3//+F+bm5gD+/sJ5/PgxBg4cCDs7Oxw9ehTz5s3DjRs3sG7dOuX8O3bsQKdOneDt7Y0pU6bg7t27yi/IV8nOzsYHH3yg/AB0cHDA1q1b0adPHzx8+FDtlN/UqVNhYGCA4cOHIzc3F9OnT0f37t2VY6f+/e9/Izc3Fzdu3MDs2bMBAJaWlhq9Hs+ztLREhw4d8MMPP+D06dPKD8QBAwYgPj4e0dHRGDx4MDIyMjB//nykpaXhwIEDqFKlCm7fvo02bdrAwcEBo0aNgo2NDa5cuYINGzaorKNPnz6Ij49HWFgY+vbti6KiIuzbtw+HDx9GkyZNAACTJk3CmDFj0LlzZ/Tt2xd37tzBvHnz0KJFC6SlpcHGxka5vPv376Nt27bo2LEjOnfujF9++QUjR45EgwYNEBYWBi8vL3z77bcYO3Ys+vfvj+bNmwMAPvzwwwq/TgBw6tQpNG/eHNbW1hgxYgSqVKmCRYsWoVWrVtizZ4/y73L8+PGYMmUK+vbtC39/fzx8+BDHjh1DamoqPv74YwBAp06dcOrUKXz99ddwd3fH7du3kZiYiGvXrlU4NL/o0qVLAAA7OzsAUO7Ppk2bYsqUKcjOzsbcuXNx4MABtdf4eT169EC/fv1w8uRJ1K9fX9mekpKC8+fP4//+7/9U+q9evRqPHj3CgAEDIJPJMH36dHTs2BGXL19WnrbbuXMnwsLCUKtWLYwfPx5PnjzBvHnz0KxZM6Smpqq9Bl26dIGXlxemTp2K33//HRMnToStrS0WLVqEoKAgTJs2DatWrcLw4cPRtGlTtGjRoszXJTExEZcvX0Z0dDScnJxw6tQpLF68GKdOncLhw4e1Pui/R48eWLx4MXbs2KHc/7t27UJYWBj8/Pwwbtw4GBgYYPny5QgKCsK+ffvg7+8P4O9Q37ZtW1SrVg0TJkxAcXExvv32Wzg4OJS6rl27dmHt2rUYNGgQ7O3t4e7uXu7PIIVCgYiICOzfvx/9+/eHl5cXTpw4gdmzZ+P8+fPYtGmTcj3z58/HhAkTkJycjFatWmn8mmRlZQGAypE/0pAg0qLc3FwBQERGRqpNu3//vrhz547y8fjxY+W0qKgoAUCMGjVKbb7n+5WYMmWKkMlk4urVq8q2Ro0aiWrVqokHDx4o23bs2CEACDc3N5X5AYhx48Ypn/fp00dUq1ZN5OTkqPT7/PPPhVwuV9aQnJwsAAgvLy/x7NkzZb+5c+cKAOLEiRPKtnbt2qmt92
Xc3NxEu3btypw+e/ZsAUBs3rxZCCHEvn37BACxatUqlX7btm1Tad+4caMAIFJSUspc9q5duwQAMXjwYLVpCoVCCCHElStXhKGhoZg0aZLK9BMnTggjIyOV9pYtWwoA4scff1S2PXv2TDg5OYlOnTop21JSUgQAsXz58jJre97y5ctfuS2RkZHC2NhYXLp0Sdl28+ZNYWVlJVq0aKFs8/X1fenrff/+fQFAzJgxo1y1vUpGRoYAICZMmCDu3LkjsrKyxO7du0Xjxo0FALF+/XpRUFAgqlatKurXry+ePHminHfLli0CgBg7dqyybdy4ceL5j/AHDx4IU1NTMXLkSJX1Dh48WFhYWIi8vDyVOuzs7MS9e/eU/TZv3iwAiN9++03Z1qhRI1G1alVx9+5dZduff/4pDAwMRM+ePdVq6d+/v7KtqKhIVK9eXchkMjF16lRl+/3794WZmZmIiopSe22e/zso7X2/Zs0aAUDs3btX2VbyN5GRkaHW/3klNd65c6fU6SX7u0OHDkKIv//uPT09RWhoqPI9UFJXzZo1xccff6xsCw8PF+bm5iIzM1PZduHCBWFkZCRe/JoFIAwMDMSpU6dU2sv7GbRy5UphYGAg9u3bp9Jv4cKFAoA4cOCA2jYnJye/9LUpzbNnz4S3t7eoWbOmKCws1Hh++htPF5JWPXz4EEDpR21atWoFBwcH5SMuLk6tT2njjczMzJT/zs/PR05ODj788EMIIZCWlgYAuHXrFtLT0xEVFQW5XK7s//HHH8Pb2/ulNQshsH79eoSHh0MIgZycHOUjNDQUubm5SE1NVZknOjoaxsbGyuclR2EuX7780nW9jpLX9NGjRwD+PsInl8vx8ccfq9Ts5+cHS0tLJCcnA4DyyMeWLVtQWFhY6rLXr18PmUyGcePGqU0rOWKwYcMGKBQKdO7cWWV9Tk5O8PT0VK7v+XqfH3tkbGwMf39/SV+j4uJi7NixA5GRkahVq5ayvVq1aujWrRv279+v/Bu1sbHBqVOncOHChVKXZWZmBmNjY+zevVvt9OvrGDduHBwcHODk5IRWrVrh0qVLmDZtGjp27Ihjx47h9u3b+Oqrr1TG6LRr1w716tXD77//XuZy5XI52rdvjzVr1ihPKRcXFyMhIQGRkZFq4xu7dOmC9957T/n8xb/hkvdUr169YGtrq+zXsGFDfPzxx/jjjz/Uaujbt6/y34aGhmjSpAmEEOjTp4+y3cbGBnXr1n3l38Hz7/unT58iJycHH3zwAQCovR+14cX3V3p6Oi5cuIBu3brh7t27yr/3/Px8BAcHY+/evVAoFCguLsbOnTsRGRkJZ2dn5fJq166NsLCwUtfVsmVLlc8lTT6D1q1bBy8vL9SrV0+lX8lp5+ffh+PHj4cQokJHsQYNGoTTp09j/vz5kv9IqDLjK0daZWVlBQDIy8tTm7Zo0SI8evQI2dnZaoO7gb/HapV2au/atWsYO3Ysfv31V7Uvu9zcXADA1atXAQCenp5q89etW/elH8p37tzBgwcPsHjxYixevLjUPrdv31Z57urqqvK85MtKm1/GLyp5TUte4wsXLiA3NxdVq1YttX9JzS1btkSnTp0wYcIEzJ49G61atUJkZCS6desGExMTAH+fsnJ2dlb5Mn3RhQsXIIQo9TUGoPbLsOrVq6ud0nnvvfckvXzHnTt38PjxY9StW1dtmpeXFxQKBa5fvw4fHx98++23aN++PerUqYP69eujbdu26NGjBxo2bAgAMDExwbRp0zBs2DA4Ojrigw8+wKeffoqePXvCycmpwjX2798fn332mfIyHSXjcoD//3dcWv316tXD/v37X7rsnj17IiEhAfv27UOLFi2wc+dOZGdno0ePHmp9X/U3/LJavLy8sH37drUfp7y4TLlcDlNTU7XTTXK5/JVjq+7du4cJEybg559/Vnv/lbzvtam09xcAREVFlTlPbm4unj59iidPnpQ6MLysweI1a9ZUea7JZ9CFCxdw5syZMk9FvvhaVcSMGTOwZMkS/Oc//8Enn3zy2st7lzFkkVbJ5XJUq1YNJ0+eVJtWMhamrEHDJiYmar84LC4uxscff4x79+5h5MiRqFevHiwsLJCZmYlevXpBoVC8ds0ly/jiiy/K/EAt+eItYWhoWGo/8dygdG0reU1LPrgVCgWqVq2KVatWldq/5ENYJpPhl19+weHDh/Hbb78pf7I+c+ZMHD58uNxjxRQKBWQyGbZu3Vrq9r+4HF28Rppo0aIFLl26hM2bN2PHjh1YunQpZs+ejYULFyqPyAwZMgTh4eHYtGkTtm/fjjFjxmDKlCnYtWsXGjduXKH1enp6IiQkRJubohQaGgpHR0f89NNPaNGiBX766Sc4OTmVuj4p9k9py6zoejp37oyDBw/in//8Jxo1agRLS0soFAq0bdtWK+/7F5X2/gL+DhxlXVrD0tIST58+1Xhdzx+le35d5fkMUigUaNCgQZmXc3ndS1DEx8dj5MiR+PLLL9XG8ZHmGLJI69q1a4elS5fi6NGjyoGhFXXixAmcP38eK1asQM+ePZXtL/4CqOR6W6Wd+jl37txL1+Hg4AArKysUFxdr9ctPmwNz8/LysHHjRtSoUQNeXl4AAA8PD+zcuRPNmjVT+9AuzQcffIAPPvgAkyZNwurVq9G9e3f8/PPP6Nu3Lzw8PLB9+3bcu3evzKNZHh4eEEKgZs2aqFOnjla2S9uDlx0cHGBubl7qPj979iwMDAxUvoRsbW0RHR2N6Oho5OXloUWLFhg/frzKaS8PDw8MGzYMw4YNw4ULF9CoUSPMnDkTP/30k1ZrB/7/3/G5c+eUp39KnDt37pXXlTM0NES3bt0QHx+PadOmYdOmTejXr1+ZQae8tbzo7NmzsLe3l+wSK/fv30dSUhImTJiAsWPHKtvLOrWrDStXrgQA5a8wPTw8AADW1tYv/VyoWrUqTE1NVX5dXKK0ttJo8hnk4eGBP//8E8HBwVp//2zevBl9+/ZFx44dSx3OQZrjmCzSuhEjRsDc3By9e/dGdna22nRN/qdc8uXw/DxCCMydO1elX7Vq1dCoUSOsWLFC5VRCYmIiTp8+/cp1dOrUCevXry/1CNydO3fKXe/zLCwstHJa48mTJ+jRowfu3buHf//738oP1s6dO6O4uBj/+c9/1OYpKirCgwcPAPz9hfXia17yP/OSn3x36tQJQghMmDBBbVkl83bs2BGGhoaYMGGC2vKEEBX6aX3Jl3RJra/L0NAQbdq0webNm1WOmGZnZ2P16tX46KOPYG1tDQBq9VpaWqJ27drK1+Tx48dqRyk8PDxgZWUl2dXBmzRpgqpVq2LhwoUq69i6dSvOnDmDdu3avXIZPXr0wP379zFgwADk5eWVemq+PJ5/Tz2/f06ePIkdO3ZIehqptPc9AI3uoKCJ1atXY+nSpQgMDERwcDAAwM/PDx4eHvjvf/9b6vCHks8FQ0NDhISEYNOmTbh586Zy+sWLF7F169ZyrV+Tz6DOnTsjMzMTS5YsUev35MkT5OfnK5+X9xIOALB37158/vnnaNGiBVatW
vXK6xhS+fBIFmmdp6cnVq9eja5du6Ju3brKK74LIZCRkYHVq1fDwMCgXJdWqFevHjw8PDB8+HBkZmbC2toa69evL3Xs05QpU9CuXTt89NFH6N27N+7du4d58+bBx8en1A/J502dOhXJyckICAhAv3794O3tjXv37iE1NRU7d+7EvXv3NH4d/Pz8kJCQgNjYWDRt2hSWlpYIDw9/6TyZmZnKIyR5eXk4ffo01q1bh6ysLAwbNgwDBgxQ9m3ZsiUGDBiAKVOmID09HW3atEGVKlVw4cIFrFu3DnPnzsU//vEPrFixAt9//z06dOgADw8PPHr0CEuWLIG1tbXyi7J169bo0aMHvvvuO1y4cEF5Smbfvn1o3bo1Bg0aBA8PD0ycOBGjR4/GlStXEBkZCSsrK2RkZGDjxo3o378/hg8frtFr5OHhARsbGyxcuBBWVlawsLBAQECA2piVFy1btkx5baHnffPNN5g4cSISExPx0Ucf4auvvoKRkREWLVqEZ8+eqVyny9vbG61atYKfnx9sbW1x7Ngx/PLLLxg0aBAA4Pz58wgODkbnzp3h7e0NIyMjbNy4EdnZ2fj888+Vyym55MLy5cvRq1cvjbb/RVWqVMG0adMQHR2Nli1bomvXrspLOLi7u2Po0KGvXEbjxo1Rv3595QDp999/v8L1zJgxA2FhYQgMDESfPn2Ul3CQy+WS3vvT2toaLVq0wPTp01FYWAgXFxfs2LFD5VpiFfXLL7/A0tISBQUFyiu+HzhwAL6+viqXhDEwMMDSpUsRFhYGHx8fREdHw8XFBZmZmUhOToa1tTV+++03AH8PMN+xYweaNWuGgQMHori4GPPnz0f9+vXLfcuk8n4G9ejRA2vXrsWXX36J5ORkNGvWDMXFxTh79izWrl2L7du3Ky+5Ut5LOFy9ehURERGQyWT4xz/+ofI6AH+fqnxxyASV05v7ISO9ay5evCgGDhwoateuLUxNTYWZmZmoV6+e+PLLL0V6erpK36ioKGFhYVHqck6fPi1CQkKEpaWlsLe3F/369RN//vlnqT/9X79+vfDy8hImJibC29tbbNiwQURFRb3yEg5CCJGdnS1iYmJEjRo1RJUqVYSTk5MIDg4WixcvVvYpuYTDunXrVOYt7SfoeXl5olu3bsLGxqbUy0i8yM3NTQAQAIRMJhPW1tbCx8dH9OvXTxw5cqTM+RYvXiz8/PyEmZmZsLKyEg0aNBAjRowQN2/eFEIIkZqaKrp27SpcXV2FiYmJqFq1qvj000/FsWPHVJZTVFQkZsyYIerVqyeMjY2Fg4ODCAsLE8ePH1d7jT/66CNhYWEhLCwsRL169URMTIw4d+6csk/Lli2Fj4+PWq2l7YvNmzcLb29v5c/dX3Y5h5Kf65f1uH79unKbQ0NDhaWlpTA3NxetW7cWBw8eVFnWxIkThb+/v7CxsVH+bU6aNEkUFBQIIYTIyckRMTExol69esLCwkLI5XIREBAg1q5dq7KcefPmCQBi27ZtZdYtxP//GynPJSESEhJE48aNhYmJibC1tRXdu3cXN27cUOnz4iUcnjd9+nQBQEyePFmjOkp7X+zcuVM0a9ZMmJmZCWtraxEeHi5Onz5dai0vXh6hrPf1i38fpb1/bty4ITp06CBsbGyEXC4Xn332mbh586ZajZpewqHkYWpqKqpXry4+/fRTsWzZMvH06dNS50tLSxMdO3YUdnZ2wsTERLi5uYnOnTuLpKQklX5JSUmicePGwtjYWHh4eIilS5eKYcOGCVNTU5V+AERMTEyp6yrPZ5AQQhQUFIhp06YJHx8fYWJiIt577z3h5+cnJkyYIHJzc9W2+VWXcCj5XCvr8eLfBJWfTIi3ZBQqEZEe6ty5M65cuYKjR4/quhSluXPnYujQobhy5YraL/7ozYmMjHzpZUKo8uNJVyKiChJCYPfu3Zg4caKuS1ESQuCHH35Ay5YtGbDeoBdv83XhwgX88ccfFbpGFVUeHJP1Ch06dMDu3bsRHByMX375RdflENFbRCaTaeW6RNqQn5+PX3/9FcnJyThx4gQ2b96s65LeKbVq1VLea/Hq1atYsGABjI2NMWLECF2XRjrE04WvsHv3bjx69AgrVqxgyCKit9aVK1dQs2ZN2NjY4KuvvsKkSZN0XdI7JTo6GsnJycjKyoKJiQkCAwMxefLk1/rhAek/hqxy2L17N+bPn8+QRUREROVWqcdk7d27F+Hh4XB2doZMJlO5O3mJuLg4uLu7w9TUFAEBAW/V4FUiIiLSX5U6ZOXn58PX17fMK9eWXMNo3LhxSE1Nha+vL0JDQ9+aMRZERESkvyr1wPewsLAy74IOALNmzUK/fv0QHR0NAFi4cCF+//13LFu2DKNGjdJ4fc+ePVO5SrNCocC9e/dgZ2en9dsfEBERkTSEEHj06BGcnZ1f6+r3lTpkvUxBQQGOHz+O0aNHK9sMDAwQEhKCQ4cOVWiZU6ZMKfW2JERERKR/rl+/Xq67k5TlnQ1ZOTk5KC4uhqOjo0q7o6Mjzp49q3weEhKCP//8E/n5+ahevTrWrVuHwMDAUpc5evRoxMbGKp/n5ubC1dUV169fV94vjYiIiN5uDx8+RI0aNWBlZfVay3lnQ1Z57dy5s9x9TUxMYGJiotZubW3NkEVERKRnXneoT6Ue+P4y9vb2MDQ0RHZ2tkp7dnY2nJycdFQVERERVRbvbMgyNjaGn58fkpKSlG0KhQJJSUllng4kIiIiKq9KfbowLy8PFy9eVD7PyMhAeno6bG1t4erqitjYWERFRaFJkybw9/fHnDlzkJ+fr/y1IREREVFFVeqQdezYMbRu3Vr5vGRQelRUFOLj49GlSxfcuXMHY8eORVZWFho1aoRt27apDYYnIiIi0hRvqyOhhw8fQi6XIzc3lwPfiYiI9IS2vr/f2TFZRERERFJiyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIkkBcXBy8vb3RtGlTXZdCREREOsLb6kiIt9UhIiLSP7ytDhEREdFbjCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhSwJxcXHw9vZG06ZNdV0KERER6YhMCCF0XURl9fDhQ8jlcuTm5sLa2lrX5RAREVE5aOv7m0eyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGRJIC4uDt7e3mjatKmuSyEiIiIdkQkhhK6LqKwePnwIuVyO3NxcWFtb67ocIiIiKgdtfX/zSBYRERGR
BBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsjSwZcsW1K1bF56enli6dKmuyyEiIqK3mJGuC9AXRUVFiI2NRXJyMuRyOfz8/NChQwfY2dnpujQiIiJ6C/FIVjkdPXoUPj4+cHFxgaWlJcLCwrBjxw5dl0VERERvqbciZGVmZuKLL76AnZ0dzMzM0KBBAxw7dkxry9+7dy/Cw8Ph7OwMmUyGTZs2ldovLi4O7u7uMDU1RUBAAI4ePaqcdvPmTbi4uCifu7i4IDMzU2s1EhERUeWi85B1//59NGvWDFWqVMHWrVtx+vRpzJw5E++9916p/Q8cOIDCwkK19tOnTyM7O7vUefLz8+Hr64u4uLgy60hISEBsbCzGjRuH1NRU+Pr6IjQ0FLdv367YhhEREdE7Techa9q0aahRowaWL18Of39/1KxZE23atIGHh4daX4VCgZiYGHTr1g3FxcXK9nPnziEoKAgrVqwodR1hYWGYOHEiOnToUGYds2bNQr9+/RAdHQ1vb28sXLgQ5ubmWLZsGQDA2dlZ5chVZmYmnJ2dK7rZREREVMnpPGT9+uuvaNKkCT777DNUrVoVjRs3xpIlS0rta2BggD/++ANpaWno2bMnFAoFLl26hKCgIERGRmLEiBEVqqGgoADHjx9HSEiIyrpCQkJw6NAhAIC/vz9OnjyJzMxM5OXlYevWrQgNDS11eXFxcfD29kbTpk0rVA8RERHpP52HrMuXL2PBggXw9PTE9u3bMXDgQAwePLjMo1LOzs7YtWsX9u/fj27duiEoKAghISFYsGBBhWvIyclBcXExHB0dVdodHR2RlZUFADAyMsLMmTPRunVrNGrUCMOGDSvzl4UxMTE4ffo0UlJSKlwTERER6TedX8JBoVCgSZMmmDx5MgCgcePGOHnyJBYuXIioqKhS53F1dcXKlSvRsmVL1KpVCz/88ANkMpnktUZERCAiIkLy9RAREZH+0/mRrGrVqsHb21ulzcvLC9euXStznuzsbPTv3x/h4eF4/Pgxhg4d+lo12Nvbw9DQUG3gfHZ2NpycnF5r2URERPRu0nnIatasGc6dO6fSdv78ebi5uZXaPycnB8HBwfDy8sKGDRuQlJSEhIQEDB8+vMI1GBsbw8/PD0lJSco2hUKBpKQkBAYGVni5RERE9O7S+enCoUOH4sMPP8TkyZPRuXNnHD16FIsXL8bixYvV+ioUCoSFhcHNzQ0JCQkwMjKCt7c3EhMTERQUBBcXl1KPauXl5eHixYvK5xkZGUhPT4etrS1cXV0BALGxsYiKikKTJk3g7++POXPmID8/H9HR0dJtPBEREVVaMiGE0HURW7ZswejRo3HhwgXUrFkTsbGx6NevX6l9ExMT0bx5c5iamqq0p6WlwcHBAdWrV1ebZ/fu3WjdurVae1RUFOLj45XP58+fjxkzZiArKwuNGjXCd999h4CAgApv18OHDyGXy5Gbmwtra+sKL4eIiIjeHG19f78VIauyYsgiIiLSP9r6/tb5mCwiIiKiyoghi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsiQQFxcHb29vNG3aVNelEBERkY7IhBBC10VUVg8fPoRcLkdubi6sra11XQ4RERGVg7a+v3kki4iIiEgCDFlEREREEmDIIiIiIpKARiGrqKgI3377LW7cuCFVPURERESVgkYhy8jICDNmzEBRUZFU9RARERFVChqfLgwKCsKePXukqIWIiIio0jDSdIawsDCMGjUKJ06cgJ+fHywsLFSmR0REaK04IiIiIn2l8XWyDAzKPvglk8lQXFz82kVVFrxOFhERkf7R1ve3xkeyFApFhVdGRERE9K7gJRyIiIiIJFChkLVnzx6Eh4ejdu3aqF27NiIiIrBv3z5t10ZERESktzQOWT/99BNCQkJgbm6OwYMHY/DgwTAzM0NwcDBWr14tRY1EREREekfjge9eXl7o378/hg4dqtI+a9YsLFmyBGfOnNFqgfqMA9+JiIj0j85uEH358mWEh4ertUdERCAjI6PChRARERFVJhqHrBo1aiApKUmtfefOnahRo4ZWiiIiIiLSdxpfwmHYsGEYPHgw0tPT8eGHHwIADhw4gPj4eMydO1frBRIRERHpI41D1sCBA+Hk5ISZM2di7dq1AP4ep5WQkID27dtrvUAiIiIifaRRyCoqKsLkyZPRu3dv7N+/X6qaiIiIiPSeRmOyjIyMMH36dBQVFUlVDxEREVGloPHA9+DgYOzZs0eKWoiIiIgqDY3HZIWFhWHUqFE4ceIE/Pz8YGFhoTI9IiJCa8URERER6SuNL0ZqYFD2wS+ZTIbi4uLXLqqy4MVIiYiI9I+2vr81PpKlUCgqvDIiIiKid4VGY7IKCwthZGSEkydPSlUPERERUaWgUciqUqUKXF1deUqQiIiI6BU0/nXhv//9b/zrX//CvXv3pKiHiIiIqFLQeEzW/PnzcfHiRTg7O8PNzU3t14WpqalaK46IiIhIX2kcsiIjIyUog4iIiKhy0fgSDu+yLVu2YNiwYVAoFBg5ciT69u370v68hAMREZH+0db3d7nHZB09evSlA96fPXumvGF0ZVRUVITY2Fjs2rULaWlpmDFjBu7evavrsoiIiOgtVe6QFRgYqBIqrK2tcfnyZeXzBw8eoGvXrtqt7i1y9OhR+Pj4wMXFBZaWlggLC8OOHTt0XRYRERG9pcodsl48q1jaWcbXPfM4depUyGQyDBky5LWW86K9e/ciPDwczs7OkMlk2LRpU6n94uLi4O7uDlNTUwQEBODo0aPKaTdv3oSLi4vyuYuLCzIzM7VaJxEREVUeGl/C4WVkMlmF501JScGiRYvQsGHDl/Y7cOAACgsL1dpPnz6N7OzsUufJz8+Hr68v4uLiylxuQkICYmNjMW7cOKSmpsLX1xehoaG4ffu2ZhtCREREBC2HrIrKy8tD9+7dsWTJErz33ntl9lMoFIiJiUG3bt1UxoedO3cOQUFBWLFiRanzhYWFYeLEiejQoUOZy541axb69euH6OhoeHt7Y+HChTA3N8eyZcsAAM7OzipHrjIzM+Hs7KzpphIREdE7QqOQdfr0afz111/466+/IITA2bNnlc9PnTpV4SJiYmLQrl07hISEvLxYAwP88ccfSEtLQ8+ePaFQKHDp0iUEBQUhMjISI0aMqND6CwoKcPz4cZX1GxgYICQkBIcOHQIA+Pv74+TJk8jMzEReXh62bt2K0NDQUpcXFxcHb29vNG3atEL1EBERkf7T6DpZwcH
BKuOuPv30UwB/nyYUQlTodOHPP/+M1NRUpKSklKu/s7Mzdu3ahebNm6Nbt244dOgQQkJCsGDBAo3XXSInJwfFxcVwdHRUaXd0dMTZs2cBAEZGRpg5cyZat24NhUKBESNGwM7OrtTlxcTEICYmRvkTUCIiInr3lDtkZWRkaH3l169fxzfffIPExESYmpqWez5XV1esXLkSLVu2RK1atfDDDz+81niw8oqIiEBERITk6yEiIiL9V+6Q5ebmpvWVHz9+HLdv38b777+vbCsuLsbevXsxf/58PHv2DIaGhmrzZWdno3///ggPD0dKSgqGDh2KefPmVbgOe3t7GBoaqg2cz87OhpOTU4WXS0RERO8unQ58Dw4OxokTJ5Cenq58NGnSBN27d0d6enqpASsnJwfBwcHw8vLChg0bkJSUhISEBAwfPrzCdRgbG8PPzw9JSUnKNoVCgaSkJAQGBlZ4uURERPTu0vjehdpkZWWF+vXrq7RZWFjAzs5OrR34O/iEhYXBzc0NCQkJMDIygre3NxITExEUFAQXFxcMHTpUbb68vDxcvHhR+TwjIwPp6emwtbWFq6srACA2NhZRUVFo0qQJ/P39MWfOHOTn5yM6OlrLW01ERETvAp2GLE0ZGBhg8uTJaN68OYyNjZXtvr6+2LlzJxwcHEqd79ixY2jdurXyeWxsLAAgKioK8fHxAIAuXbrgzp07GDt2LLKystCoUSNs27ZNbTA8ERERUXnwBtES4g2iiYiI9M8bv0E0EREREZVfuU4XNm7cuNyXSEhNTX2tgoiIiIgqg3KFrMjISOW/nz59iu+//x7e3t7KX94dPnwYp06dwldffSVJkURERET6plwha9y4ccp/9+3bF4MHD8Z//vMftT7Xr1/XbnVEREREekrjge9yuRzHjh2Dp6enSvuFCxfQpEkT5ObmarVAfcaB70RERPpHZwPfzczMcODAAbX2AwcOaHRrHCIiIqLKTOPrZA0ZMgQDBw5Eamoq/P39AQBHjhzBsmXLMGbMGK0XSERERKSPNA5Zo0aNQq1atTB37lz89NNPAAAvLy8sX74cnTt31nqBRERERPqIFyOVEMdkERER6R+dXoz0wYMHWLp0Kf71r3/h3r17AP6+PlZmZmaFCyEiIiKqTDQ+XfjXX38hJCQEcrkcV65cQd++fWFra4sNGzbg2rVr+PHHH6Wok4iIiEivaHwkKzY2Fr169cKFCxdUfk34ySefYO/evVotjoiIiEhfaRyyUlJSMGDAALV2FxcXZGVlaaUoIiIiIn2nccgyMTHBw4cP1drPnz8PBwcHrRRFREREpO80DlkRERH49ttvUVhYCACQyWS4du0aRo4ciU6dOmm9QCIiIiJ9pHHImjlzJvLy8lC1alU8efIELVu2RO3atWFlZYVJkyZJUSMRERGR3tH414VyuRyJiYk4cOAA/vzzT+Tl5eH9999HSEiIFPURERER6SWNQlZhYSHMzMyQnp6OZs2aoVmzZlLVRURERKTXNDpdWKVKFbi6uqK4uFiqeoiIiIgqBY3HZP373/9WudI7EREREanTeEzW/PnzcfHiRTg7O8PNzQ0WFhYq01NTU7VWHBEREZG+0jhkRUZGSlAGERERUeUiE0IIXRdRWWnrLt5ERET05mjr+1vjMVlERERE9Goany4sLi7G7NmzsXbtWly7dg0FBQUq0zkgnoiIiKgCR7ImTJiAWbNmoUuXLsjNzUVsbCw6duwIAwMDjB8/XoISiYiIiPSPxiFr1apVWLJkCYYNGwYjIyN07doVS5cuxdixY3H48GEpaiQiIiLSOxqHrKysLDRo0AAAYGlpidzcXADAp59+it9//1271RERERHpKY1DVvXq1XHr1i0AgIeHB3bs2AEASElJgYmJiXarIyIiItJTGoesDh06ICkpCQDw9ddfY8yYMfD09ETPnj3Ru3dvrRdIREREpI9e+zpZhw4dwqFDh+Dp6Ynw8HBt1VUp8DpZRERE+kdb398aX8LhRYGBgQgMDHzdxRARERFVKhqHrB9//PGl03v27FnhYt52W7ZswbBhw6BQKDBy5Ej07dtX1yURERHRW0rj04XvvfeeyvPCwkI8fvwYxsbGMDc3r7QXIy0qKoK3tzeSk5Mhl8vh5+eHgwcPws7Orsx5eLqQiIhI/+jstjr3799XeeTl5eHcuXP46KOPsGbNmgoX8rY7evQofHx84OLiAktLS4SFhSl/WUlERET0Iq3cu9DT0xNTp07FN998o/G8CxYsQMOGDWFtbQ1ra2sEBgZi69at2ihLae/evQgPD4ezszNkMhk2bdpUar+4uDi4u7vD1NQUAQEBOHr0qHLazZs34eLionzu4uKCzMxMrdZJRERElYfWbhBtZGSEmzdvajxf9erVMXXqVBw/fhzHjh1DUFAQ2rdvj1OnTpXa/8CBAygsLFRrP336NLKzs0udJz8/H76+voiLiyuzjoSEBMTGxmLcuHFITU2Fr68vQkNDcfv2bY23iYiIiEjjge+//vqrynMhBG7duoX58+ejWbNmGhfw4mUfJk2ahAULFuDw4cPw8fFRmaZQKBATEwNPT0/8/PPPMDQ0BACcO3cOQUFBiI2NxYgRI9TWERYWhrCwsJfWMWvWLPTr1w/R0dEAgIULF+L333/HsmXLMGrUKDg7O6scucrMzIS/v7/G20tERETvBo1DVmRkpMpzmUwGBwcHBAUFYebMma9VTHFxMdatW4f8/PxSLwthYGCAP/74Ay1atEDPnj2xcuVKZGRkICgoCJGRkaUGrPIoKCjA8ePHMXr0aJV1hYSE4NChQwAAf39/nDx5EpmZmZDL5di6dSvGjBlT6vLi4uIQFxeH4uLiCtVDRERE+k/jkKVQKLRexIkTJxAYGIinT5/C0tISGzduhLe3d6l9nZ2dsWvXLjRv3hzdunXDoUOHEBISggULFlR4/Tk5OSguLoajo6NKu6OjI86ePQvg79OhM2fOROvWraFQKDBixIgyf1kYExODmJgY5a8TiIiI6N3z2hcj1Ya6desiPT0dubm5+OWXXxAVFYU9e/aUGbRcXV2xcuVKtGzZErVq1cIPP/wAmUwmeZ0RERGIiIiQfD1ERESk/zQOWbGxseXuO2vWrHL1MzY2Ru3atQEAfn5+SElJwdy5c7Fo0aJS+2dnZ6N///4IDw9HSkoKhg4dinnz5pW7rhfZ29vD0NBQbeB8dnY2nJycKrxcIiIiendpHLLS0tKQlpaGwsJC1K1bFwBw/vx5GBoa4v3331f2e50jSwqFAs+ePSt1Wk5ODoKDg+Hl5YV169bh/PnzaNWqFUxMTPDf//63QuszNjaGn58fkpKSlGPOFAoFkpKSMGjQoIpuBhEREb3DNA5Z4eHhsLKywooVK5RXf79//z6io6PRvHlzDBs2TKPljR49GmFhYXB1dcWjR4+wevVq7N69G9u3b1frq1AoEBYWBjc3NyQkJMDIyAje3t5ITExEUFAQXFxcMHToULX58vLycPHiReXzjIwMpKenw9bWFq6urgD+PkIXFRWFJk2awN/fH3PmzEF+fr
7y14ZEREREGhEacnZ2FidPnlRrP3HihKhWrZqmixO9e/cWbm5uwtjYWDg4OIjg4GCxY8eOMvvv2LFDPHnyRK09NTVVXL9+vdR5kpOTBQC1R1RUlEq/efPmCVdXV2FsbCz8/f3F4cOHNd6e5+Xm5goAIjc397WWQ0RERG+Otr6/Nb53oZWVFX777Te0atVKpT05ORkRERF49OiRVsJfZcB7FxIREekfnd27sEOHDoiOjsaGDRtw48YN3LhxA+vXr0efPn3QsWPHChdCREREVJloPCZr4cKFGD58OLp166a8vY2RkRH69OmDGTNmaL1AIiIiIn2k8enCEvn5+bh06RIAwMPDAxYWFlotrDLg6UIiIiL9o7PThSUsLCzQsGFDyOVyXL16VZIrwRMRERHpq3KHrGXLlqldXLR///6oVasWGjRogPr16+P69etaL5CIiIhIH5U7ZC1evFh5XSwA2LZtG5YvX44ff/wRKSkpsLGxwYQJEyQpkoiIiEjflHvg+4ULF9CkSRPl882bN6N9+/bo3r07AGDy5Mm8cCcRERHR/5T7SNaTJ09UBn8dPHgQLVq0UD6vVasWsrKytFsdERERkZ4qd8hyc3PD8ePHAfx9/8BTp06hWbNmyulZWVmQy+Xar5CIiIhID5X7dGFUVBRiYmJw6tQp7Nq1C/Xq1YOfn59y+sGDB1G/fn1JiiQiIiLSN+UOWSNGjMDjx4+xYcMGODk5Yd26dSrTDxw4gK5du2q9QCIiIiJ9VOGLkdKr8WKkRERE+kfnFyMlIiIiorIxZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikkC5L+FQori4GPHx8UhKSsLt27ehUChUpu/atUtrxRERERHpK41D1jfffIP4+Hi0a9cO9evXh0wmk6IuIiIiIr2mccj6+eefsXbtWnzyySdS1ENERERUKWg8JsvY2Bi1a9eWohYiIiKiSkPjkDVs2DDMnTsXvFA8ERERUdk0Pl24f/9+JCcnY+vWrfDx8UGVKlVUpm/YsEFrxRERERHpK41Dlo2NDTp06CBFLURERESVhsYha/ny5VLUQURERFSp8GKkRERERBLQ+EgWAPzyyy9Yu3Ytrl27hoKCApVpqampWimMiIiISJ9pfCTru+++Q3R0NBwdHZGWlgZ/f3/Y2dnh8uXLCAsLk6JGIiIiIr2jccj6/vvvsXjxYsybNw/GxsYYMWIEEhMTMXjwYOTm5kpRIxEREZHe0ThkXbt2DR9++CEAwMzMDI8ePQIA9OjRA2vWrNFudURERER6SuOQ5eTkhHv37gEAXF1dcfjwYQBARkYGL1BKRERE9D8ah6ygoCD8+uuvAIDo6GgMHToUH3/8Mbp06cLrZxERERH9j0xoePhJoVBAoVDAyOjvHyb+/PPPOHjwIDw9PTFgwAAYGxtLUqg+evjwIeRyOXJzc2Ftba3rcoiIiKgctPX9rXHIovJjyCIiItI/2vr+rtDFSPft24cvvvgCgYGByMzMBACsXLkS+/fvr3AhRERERJWJxiFr/fr1CA0NhZmZGdLS0vDs2TMAQG5uLiZPnqz1AomIiIj0kcYha+LEiVi4cCGWLFmCKlWqKNubNWvGq70TERER/Y/GIevcuXNo0aKFWrtcLseDBw+0URMRERGR3qvQdbIuXryo1r5//37UqlVLK0URERER6TuNQ1a/fv3wzTff4MiRI5DJZLh58yZWrVqF4cOHY+DAgVLUSERERKR3jDSdYdSoUVAoFAgODsbjx4/RokULmJiYYPjw4fj666+lqJGIiIhI71T4OlkFBQW4ePEi8vLy4O3tDUtLS23Xpvd4nSwiIiL9o63vb42PZJUwNjaGt7d3hVdMREREVJmVO2T17t27XP2WLVtW4WKIiIiIKotyh6z4+Hi4ubmhcePG4J14iIiIiF6u3CFr4MCBWLNmDTIyMhAdHY0vvvgCtra2UtZGREREpLfKfQmHuLg43Lp1CyNGjMBvv/2GGjVqoHPnzti+fTuPbBERERG9oMK/Lrx69Sri4+Px448/oqioCKdOneIvDF/AXxcSERHpH219f2t8MVLljAYGkMlkEEKguLi4wgUQERERVUYahaxnz55hzZo1+Pjjj1GnTh2cOHEC8+fPx7Vr13gUi4iIiOg55R74/tVXX+Hnn39GjRo10Lt3b6xZswb29vZS1kZERESkt8o9JsvAwACurq5o3LgxZDJZmf02bNigteLeNlu2bMGwYcOgUCgwcuRI9O3b96X9OSaLiIhI/7zxK7737NnzpeGqsisqKkJsbCySk5Mhl8vh5+eHDh06wM7OTtelERER0VtIo4uRvsuOHj0KHx8fuLi4AADCwsKwY8cOdO3aVceVERER0duowr8u1JYpU6agadOmsLKyQtWqVREZGYlz585pdR179+5FeHg4nJ2dIZPJsGnTplL7xcXFwd3dHaampggICMDRo0eV027evKkMWADg4uKCzMxMrdZJRERElYfOQ9aePXsQExODw4cPIzExEYWFhWjTpg3y8/NL7X/gwAEUFhaqtZ8+fRrZ2dmlzpOfnw9fX1/ExcWVWUdCQgJiY2Mxbtw4pKamwtfXF6Ghobh9+3bFNoyIiIjeaToPWdu2bUOvXr3g4+MDX19fxMfH49q1azh+/LhaX4VCgZiYGHTr1k3l2lznzp1DUFAQVqxYUeo6wsLCMHHiRHTo0KHMOmbNmoV+/fohOjoa3t7eWLhwIczNzZU3vHZ2dlY5cpWZmQlnZ+eKbjYRERFVcjoPWS/Kzc0FgFLvi2hgYIA//vgDaWlp6NmzJxQKBS5duoSgoCBERkZixIgRFVpnQUEBjh8/jpCQEJV1hYSE4NChQwAAf39/nDx5EpmZmcjLy8PWrVsRGhpa6vLi4uLg7e2Npk2bVqgeIiIi0n9vVchSKBQYMmQImjVrhvr165fax9nZGbt27cL+/fvRrVs3BAUFISQkBAsWLKjwenNyclBcXAxHR0eVdkdHR2RlZQEAjIyMMHPmTLRu3RqNGjXCsGHDyvxlYUxMDE6fPo2UlJQK10RERET6rdy/LnwTYmJicPLkSezfv/+l/VxdXbFy5Uq0bNkStWrVwg8//PBGLi8RERGBiIgIyddDRERE+u+tOZI1aNAgbNmyBcnJyahevfpL+2ZnZ6N///4IDw/H48ePMXTo0Ndat729PQwNDdUGzmdnZ8PJyem1lk1ERETvJp2HLCEEBg0ahI0bN2LXrl2oWbPmS/vn5OQgODgYXl5e2LBhA5KSkpCQkIDhw4dXuAZjY2P4+fkhKSlJ2aZQKJCUlITAwMAKL5eIiIjeXTo/XRgTE4PVq1dj8+bNsLKyUo6BksvlMDMzU+mrUCgQFhYGNzc3JCQkwMjICN7e3khMTERQUBBcXFxKPaqVl5eHixcvKp9nZGQgPT0dtra2cHV1BQDExsYiKioKTZo0gb+/P+bMmYP8/HxER0dLuPVERERUWZX73oWSFVDGWKrly5ejV69eau2JiYlo3rw5TE1NVdrT0tLg4OBQ6qnG3bt3o3Xr1mrtUVFRKleynz9/P
mbMmIGsrCw0atQI3333HQICAjTboOfw3oVERET6R1vf3zoPWZUZQxYREZH+0db3t87HZBERERFVRgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyCIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEEGLKIiIiIJMCQRURERCQBhiwiIiIiCTBkEREREUmAIYuIiIhIAgxZRERERBJgyHqFLVu2oG7duvD09MTSpUt1XQ4RERHpCSNdF/A2KyoqQmxsLJKTkyGXy+Hn54cOHTrAzs5O16URERHRW45Hsl7i6NGj8PHxgYuLCywtLREWFoYdO3bouiwiIiLSA5U6ZO3duxfh4eFwdnaGTCbDpk2b1PrExcXB3d0dpqamCAgIwNGjR5XTbt68CRcXF+VzFxcXZGZmvonSiYiISM9V6pCVn58PX19fxMXFlTo9ISEBsbGxGDduHFJTU+Hr64vQ0FDcvn37DVdKRERElU2lHpMVFhaGsLCwMqfPmjUL/fr1Q3R0NABg4cKF+P3337Fs2TKMGjUKzs7OKkeuMjMz4e/vX+bynj17hmfPnimf5+bmAgAePnz4uptCREREb0jJ97YQ4vUWJN4RAMTGjRuVz589eyYMDQ1V2oQQomfPniIiIkIIIURhYaGoXbu2uHHjhnj06JGoU6eOyMnJKXMd48aNEwD44IMPPvjgg49K8Lh+/fprZY9KfSTrZXJyclBcXAxHR0eVdkdHR5w9exYAYGRkhJkzZ6J169ZQKBQYMWLES39ZOHr0aMTGxiqfKxQK3Lt3D3Z2dpDJZK9d88OHD1GjRg1cv34d1tbWr728txm3tXJ6V7b1XdlOgNtaGb0r2wmUva1CCDx69AjOzs6vtfx3NmSVV0REBCIiIsrV18TEBCYmJiptNjY2Wq/J2tq60v/hl+C2Vk7vyra+K9sJcFsro3dlO4HSt1Uul7/2civ1wPeXsbe3h6GhIbKzs1Xas7Oz4eTkpKOqiIiIqLJ4Z0OWsbEx/Pz8kJSUpGxTKBRISkpCYGCgDisjIiKiyqBSny7My8vDxYsXlc8zMjKQnp4OW1tbuLq6IjY2FlFRUWjSpAn8/f0xZ84c5OfnK39t+LYxMTHBuHHj1E5JVkbc1srpXdnWd2U7AW5rZfSubCcg/bbK/vfLu0pp9+7daN26tVp7VFQU4uPjAQDz58/HjBkzkJWVhUaNGuG7775DQEDAG66UiIiIKptKHbKIiIiIdOWdHZNFREREJCWGLCIiIiIJMGQRERERSYAhS4/ExcXB3d0dpqamCAgIwNGjR3Vd0muZMmUKmjZtCisrK1StWhWRkZE4d+6cSp9WrVpBJpOpPL788ksdVVxx48ePV9uOevXqKac/ffoUMTExsLOzg6WlJTp16qR2DTd94e7urratMpkMMTExAPR7n+7duxfh4eFwdnaGTCbDpk2bVKYLITB27FhUq1YNZmZmCAkJwYULF1T63Lt3D927d4e1tTVsbGzQp08f5OXlvcGteLWXbWdhYSFGjhyJBg0awMLCAs7OzujZsydu3rypsozS/g6mTp36hrfk1V61T3v16qW2HW3btlXpow/7FHj1tpb2vpXJZJgxY4ayjz7s1/J8t5TnM/fatWto164dzM3NUbVqVfzzn/9EUVGRRrUwZOmJhIQExMbGYty4cUhNTYWvry9CQ0Nx+/ZtXZdWYXv27EFMTAwOHz6MxMREFBYWok2bNsjPz1fp169fP9y6dUv5mD59uo4qfj0+Pj4q27F//37ltKFDh+K3337DunXrsGfPHty8eRMdO3bUYbUVl5KSorKdiYmJAIDPPvtM2Udf92l+fj58fX0RFxdX6vTp06fju+++w8KFC3HkyBFYWFggNDQUT58+Vfbp3r07Tp06hcTERGzZsgV79+5F//7939QmlMvLtvPx48dITU3FmDFjkJqaig0bNuDcuXOl3hnj22+/VdnPX3/99ZsoXyOv2qcA0LZtW5XtWLNmjcp0fdinwKu39fltvHXrFpYtWwaZTIZOnTqp9Hvb92t5vlte9ZlbXFyMdu3aoaCgAAcPHsSKFSsQHx+PsWPHalbMa935kN4Yf39/ERMTo3xeXFwsnJ2dxZQpU3RYlXbdvn1bABB79uxRtrVs2VJ88803uitKS8aNGyd8fX1LnfbgwQNRpUoVsW7dOmXbmTNnBABx6NChN1ShdL755hvh4eEhFAqFEKLy7FNA9abzCoVCODk5iRkzZijbHjx4IExMTMSaNWuEEEKcPn1aABApKSnKPlu3bhUymUxkZma+sdo18eJ2lubo0aMCgLh69aqyzc3NTcyePVva4rSstG2NiooS7du3L3MefdynQpRvv7Zv314EBQWptOnjfn3xu6U8n7l//PGHMDAwEFlZWco+CxYsENbW1uLZs2flXjePZOmBgoICHD9+HCEhIco2AwMDhISE4NChQzqsTLtyc3MBALa2tirtq1atgr29PerXr4/Ro0fj8ePHuijvtV24cAHOzs6oVasWunfvjmvXrgEAjh8/jsLCQpX9W69ePbi6uur9/i0oKMBPP/2E3r17q9wkvbLs0+dlZGQgKytLZT/K5XIEBAQo9+OhQ4dgY2ODJk2aKPuEhITAwMAAR44ceeM1a0tubi5kMpnavVqnTp0KOzs7NG7cGDNmzND4VMvbYvfu3ahatSrq1q2LgQMH4u7du8pplXWfZmdn4/fff0efPn3Upunbfn3xu6U8n7mH
Dh1CgwYN4OjoqOwTGhqKhw8f4tSpU+Ved6W+4ntlkZOTg+LiYpWdDQCOjo44e/asjqrSLoVCgSFDhqBZs2aoX7++sr1bt25wc3ODs7Mz/vrrL4wcORLnzp3Dhg0bdFit5gICAhAfH4+6devi1q1bmDBhApo3b46TJ08iKysLxsbGal9Qjo6OyMrK0k3BWrJp0yY8ePAAvXr1UrZVln36opJ9Vdr7tGRaVlYWqlatqjLdyMgItra2eruvnz59ipEjR6Jr164qN9gdPHgw3n//fdja2uLgwYMYPXo0bt26hVmzZumwWs21bdsWHTt2RM2aNXHp0iX861//QlhYGA4dOgRDQ8NKuU8BYMWKFbCyslIbtqBv+7W075byfOZmZWWV+l4umVZeDFn0VoiJicHJkydVxikBUBnX0KBBA1SrVg3BwcG4dOkSPDw83nSZFRYWFqb8d8OGDREQEAA3NzesXbsWZmZmOqxMWj/88APCwsLg7OysbKss+5T+HgTfuXNnCCGwYMEClWmxsbHKfzds2BDGxsYYMGAApkyZole3a/n888+V/27QoAEaNmwIDw8P7N69G8HBwTqsTFrLli1D9+7dYWpqqtKub/u1rO+WN4WnC/WAvb09DA0N1X75kJ2dDScnJx1VpT2DBg3Cli1bkJycjOrVq7+0b8ktj56/J6U+srGxQZ06dXDx4kU4OTmhoKAADx48UOmj7/v36tWr2LlzJ/r27fvSfpVln5bsq5e9T52cnNR+rFJUVIR79+7p3b4uCVhXr15FYmKiylGs0gQEBKCoqAhXrlx5MwVKpFatWrC3t1f+vVamfVpi3759OHfu3Cvfu8DbvV/L+m4pz2euk5NTqe/lkmnlxZClB4yNjeHn54ekpCRlm0KhQFJSEgIDA3VY2esRQmDQoEHYuHEjdu3ahZo1a75ynvT0dABAtWrVJK5OWnl5ebh06RKqVasGPz8/VKlSRWX/njt3DteuXdPr/bt8+XJUrVoV7dq1e2m/yrJPa9asCScnJ5X9+PDhQxw5ckS5HwMDA/HgwQMcP35c2WfXrl1QKBR6dc/UkoB14cIF7Ny5E3Z2dq+cJz09HQYGBmqn1vTNjRs3cPfuXeXfa2XZp8/74Ycf4OfnB19f31f2fRv366u+W8rzmRsYGIgTJ06oBOiS/0x4e3trVAzpgZ9//lmYmJiI+Ph4cfr0adG/f39hY2Oj8ssHfTNw4EAhl8vF7t27xa1bt5SPx48fCyGEuHjxovj222/FsWPHREZGhti8ebOoVauWaNGihY4r19ywYcPE7t27RUZGhjhw4IAICQkR9vb24vbt20IIIb788kvh6uoqdu3aJY4dOyYCAwNFYGCgjquuuOLiYuHq6ipGjhyp0q7v+/TRo0ciLS1NpKWlCQBi1qxZIi0tTfmruqlTpwobGxuxefNm8ddff4n27duLmjVriidPniiX0bZtW9G4cWNx5MgRsX//fuHp6Sm6du2qq00q1cu2s6CgQERERIjq1auL9PR0lfduya+uDh48KGbPni3S09PFpUuXxE8//SQcHBxEz549dbxl6l62rY8ePRLDhw8Xhw4dEhkZGWLnzp3i/fffF56enuLp06fKZejDPhXi1X+/QgiRm5srzM3NxYIFC9Tm15f9+qrvFiFe/ZlbVFQk6tevL9q0aSPS09PFtm3bhIODgxg9erRGtTBk6ZF58+YJV1dXYWxsLPz9/cXhw4d1XdJrAVDqY/ny5UIIIa5duyZatGghbG1thYmJiahdu7b45z//KXJzc3VbeAV06dJFVKtWTRgbGwsXFxfRpUsXcfHiReX0J0+eiK+++kq89957wtzcXHTo0EHcunVLhxW/nu3btwsA4ty5cyrt+r5Pk5OTS/2bjYqKEkL8fRmHMWPGCEdHR2FiYiKCg4PVXoO7d++Krl27CktLS2FtbS2io6PFo0ePdLA1ZXvZdmZkZJT53k1OThZCCHH8+HEREBAg5HK5MDU1FV5eXmLy5MkqweRt8bJtffz4sWjTpo1wcHAQVapUEW5ubqJfv35q/7nVh30qxKv/foUQYtGiRcLMzEw8ePBAbX592a+v+m4RonyfuVeuXBFhYWHCzMxM2Nvbi2HDhonCwkKNapH9ryAiIiIi0iKOySIiIiKSAEMWERERkQQYsoiIiIgkwJBFREREJAGGLCIiIiIJMGQRERERSYAhi4iIiEgCDFlERFrm7u6OOXPm6LoMItIxhiwi0mu9evVCZGQkAKBVq1YYMmTIG1t3fHw8bGxs1NpTUlLQv3//N1YHEb2djHRdABHR26agoADGxsYVnt/BwUGL1RCRvuKRLCKqFHr16oU9e/Zg7ty5kMlkkMlkuHLlCgDg5MmTCAsLg6WlJRwdHdGjRw/k5OQo523VqhUGDRqEIUOGwN7eHqGhoQCAWbNmoUGDBrCwsECNGjXw1VdfIS8vDwCwe/duREdHIzc3V7m+8ePHA1A/XXjt2jW0b98elpaWsLa2RufOnZGdna2cPn78eDRq1AgrV66Eu7s75HI5Pv/8czx69EjZ55dffkGDBg1gZmYGOzs7hISEID8/X6JXk4i0gSGLiCqFuXPnIjAwEP369cOtW7dw69Yt1KhRAw8ePEBQUBAaN26MY8eOYdu2bcjOzkbnzp1V5l+xYgWMjY1x4MABLFy4EABgYGCA7777DqdOncKKFSuwa9cujBgxAgDw4YcfYs6cObC2tlaub/jw4Wp1KRQKtG/fHvfu3cOePXuQmJiIy5cvo0uXLir9Ll26hE2bNmHLli3YsmUL9uzZg6lTpwIAbt26ha5du6J37944c+YMdu/ejY4dO4K3niV6u/F0IRFVCnK5HMbGxjA3N4eTk5Oyff78+WjcuDEmT56sbFu2bBlq1KiB8+fPo06dOgAAT09PTJ8+XWWZz4/vcnd3x8SJE/Hll1/i+++/h7GxMeRyOWQymcr6XpSUlIQTJ04gIyMDNWrUAAD8+OOP8PHxQUpKCpo2bQrg7zAWHx8PKysrAECPHj2QlJSESZMm4datWygqKkLHjh3h5uYGAGjQoMFrvFpE9CbwSBYRVWp//vknkpOTYWlpqXzUq1cPwN9Hj0r4+fmpzbtz504EBwfDxcUFVlZW6NGjB+7evYvHjx+Xe/1nzpxBjRo1lAELALy9vWFjY4MzZ84o29zd3ZUBCwCqVauG27dvAwB8fX0RHByMBg0a4LPPPsOSJUtw//798r8IRKQTDFlEVKnl5eUhPDwc6enpKo8LFy6gRYsWyn4WFhYq8125cgWffvopGjZsiPXr1+P48eOIi4sD8PfAeG2rUqWKynOZTAaFQgEAMDQ0RGJiIrZu3Qpvb2/MmzcPdevWRUZGhtbrICLtYcgiokrD2NgYxcXFKm3vv/8+Tp06BXd3d9SuXVvl8WKwet7x48ehUCgwc+ZMfPDBB6hTpw5u3rz5yvW9yMvLC9evX8f169eVbadPn8aDBw/g7e1d7m2TyWRo1qwZJkyYgLS0NBgbG2Pjxo3lnp+I3jyGLCKqNNzd3XHkyBFcuXIFOTk5UCgUiImJwb1
799C1a1ekpKTg0qVL2L59O6Kjo18akGrXro3CwkLMmzcPly9fxsqVK5UD4p9fX15eHpKSkpCTk1PqacSQkBA0aNAA3bt3R2pqKo4ePYqePXuiZcuWaNKkSbm268iRI5g8eTKOHTuGa9euYcOGDbhz5w68vLw0e4GI6I1iyCKiSmP48OEwNDSEt7c3HBwccO3aNTg7O+PAgQMoLi5GmzZt0KBBAwwZMgQ2NjYwMCj7I9DX1xezZs3CtGnTUL9+faxatQpTpkxR6fPhhx/iyy+/RJcuXeDg4KA2cB74+wjU5s2b8d5776FFixYICQlBrVq1kJCQUO7tsra2xt69e/HJJ5+gTp06+L//+z/MnDkTYWFh5X9xiOiNkwn+BpiIiIhI63gki4iIiEgCDFlEREREEmDIIiIiIpIAQxYRERGRBBiyiIiIiCTAkEVEREQkAYYsIiIiIgkwZBERERFJgCGLiIiISAIMWUREREQSYMgiIiIikgBDFhEREZEE/h+3Z+XdHvYesgAAAABJRU5ErkJggg==", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Test MSE: nan\n", "Standard Error of MSE: nan\n" ] } ], "source": [ "# Set the learning rate and run the model\n", "alpha = 0.1\n", "run(alpha)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.7" } }, "nbformat": 4, "nbformat_minor": 2 }